63 research outputs found

    On the parametrization of clapping

    For a Reactive Virtual Trainer (RVT), subtle timing and lifelike motion are of primary importance. To allow for reactivity, movement adaptation, such as a change of tempo, is necessary. In this paper we investigate the relation between movement tempo, its synchronization with verbal counting, time distribution, amplitude, and left-right symmetry of a clapping movement. We analyze motion capture data of two subjects performing a clapping exercise, both freely and timed by a metronome. Our findings are compared to existing gesture research and biomechanical models. We found that, for our subjects, verbal counting adheres to the phonological synchrony rule. A linear relationship between movement path length and tempo was found, and the symmetry between the left and the right hand can be described by a biomechanical model of two coupled oscillators.
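
    The "biomechanical model of two coupled oscillators" mentioned above is, in the bimanual-coordination literature, most commonly written as the Haken-Kelso-Bunz relative-phase equation. The following is only an illustrative sketch of that standard formulation, not necessarily the exact model the paper compares against:

        \dot{\phi} = -a \sin\phi - 2b \sin 2\phi

    Here \phi is the relative phase between the left and the right hand; \phi = 0 (in-phase clapping) and \phi = \pi (anti-phase) are fixed points, and since the ratio b/a decreases with increasing tempo, the anti-phase pattern loses stability at high tempos while the in-phase pattern remains stable.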

    A Demonstration of Continuous Interaction with Elckerlyc

    We discuss behavior planning in the style of the SAIBA framework for continuous (as opposed to turn-based) interaction. Such interaction requires the real-time application of minor shape or timing modifications to running behavior and the anticipation of the behavior of a (human) interaction partner. We discuss how behavior (re)planning and on-the-fly parameter modification fit into the current SAIBA framework, and what type of language or architecture extensions might be necessary. Our BML realizer Elckerlyc provides flexible mechanisms for both the specification and the execution of modifications to running behavior. We show how these mechanisms are used in a virtual trainer and in two turn-taking scenarios.
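
    The on-the-fly modification mechanism described above can be illustrated with a small, purely hypothetical sketch (Python; this is not Elckerlyc's actual Java API): a running behavior keeps a predicted time for a symbolic sync point, and a re-timing request stretches only the part of the behavior that has not been played yet.

        # Hypothetical sketch, not Elckerlyc's actual API: re-timing a sync point
        # of a behavior that is already running, without visibly restarting it.
        from dataclasses import dataclass

        @dataclass
        class RunningBehavior:
            start: float    # wall-clock time at which the behavior started (s)
            stroke: float   # currently predicted time of its 'stroke' sync point (s)
            end: float      # currently predicted end time (s)

            def retime_stroke(self, new_stroke: float, now: float) -> None:
                """Shift the predicted stroke time; only the not-yet-played part
                of the behavior is stretched, so no visible restart occurs."""
                if not (now < self.stroke and now < new_stroke):
                    raise ValueError("only future sync points can be re-timed")
                scale = (new_stroke - now) / (self.stroke - now)
                self.stroke = new_stroke
                self.end = now + (self.end - now) * scale  # keep the tail's relative timing

        # usage: an anticipator predicts the partner's next beat 0.2 s later than before
        gesture = RunningBehavior(start=10.0, stroke=11.0, end=11.6)
        gesture.retime_stroke(new_stroke=11.2, now=10.4)
        print(gesture)  # stroke moved to 11.2 s, end stretched to 12.0 s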

    Establishing Rapport with a Virtual Dancer

    We discuss an embodied agent that acts as a dancer and invites human partners to dance with her. The dancer has a repertoire of gestures and moves, obtained from inverse kinematics and motion capture, that can be combined in order to dance both to the beat of the music provided to her and to sensor input (visual and dance pad) from a human partner. The interaction between the virtual dancer and the human dancer allows alternating ‘lead’ and ‘follow’ behavior, from the point of view of both the virtual and the human dancer.
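
    A minimal sketch of the underlying scheduling idea (the move names, stroke offsets and function below are invented for illustration and are not the system's actual code): each move is started early enough for its stroke to land on the next predicted musical beat, and the dancer follows the human when the dance pad shows activity, otherwise it leads with a move of its own.

        # Invented illustration of beat-aligned move selection with lead/follow switching.
        import random
        from typing import Optional, Tuple

        # stroke offset from the start of each clip, in seconds (made-up values)
        MOVES = {"step_left": 0.4, "step_right": 0.4, "arm_wave": 0.8}

        def next_move(next_beat: float, pad_active: bool,
                      observed_human_move: Optional[str]) -> Tuple[str, float]:
            """Return (move, start_time) such that the move's stroke hits the beat."""
            if pad_active and observed_human_move in MOVES:
                move = observed_human_move          # 'follow': mirror the human dancer
            else:
                move = random.choice(list(MOVES))   # 'lead': propose a move of our own
            return move, next_beat - MOVES[move]    # start early enough to hit the beat

        print(next_move(next_beat=12.0, pad_active=True, observed_human_move="arm_wave"))
        # -> ('arm_wave', 11.2)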

    Leading and following with a virtual trainer

    This paper describes experiments with a virtual fitness trainer capable of mutually coordinated interaction. The virtual human co-exercises along with the user, leading as well as following in tempo, to motivate the user and to influence the speed with which the user performs the exercises. In a series of three experiments (20 participants in total) we attempted to influence the users' performance by manipulating the (timing of the) exercise behavior of the virtual trainer. The results show that it is possible to do this implicitly, using only micro-adjustments to the trainer's bodily behavior. As such, the system is a first step in the direction of mutually coordinated bodily interaction for virtual humans.
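
    The abstract does not spell out the coupling rule used in the experiments, so the following is only an illustrative sketch of how ‘leading and following in tempo’ could be realized: the trainer's tempo for the next repetition is pulled toward the user's observed tempo when following, and additionally biased toward a target tempo when leading.

        # Illustrative only; not the experimental manipulation reported in the paper.
        def update_trainer_tempo(trainer_rpm: float, user_rpm: float, target_rpm: float,
                                 leading: bool, follow_gain: float = 0.3,
                                 lead_gain: float = 0.1) -> float:
            """Tempo (repetitions per minute) the trainer uses for the next repetition."""
            coupled = trainer_rpm + follow_gain * (user_rpm - trainer_rpm)
            if leading:
                # stay weakly coupled to the user, but nudge toward the desired tempo
                return coupled + lead_gain * (target_rpm - trainer_rpm)
            return coupled

        # usage: the user drags at 55 rpm, the trainer wants to pull the pair toward 65 rpm
        print(update_trainer_tempo(trainer_rpm=60.0, user_rpm=55.0,
                                   target_rpm=65.0, leading=True))  # -> 59.0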

    OpenBMLParser: An Open Source BML Parser/Analyzer

    van Welbergen H. OpenBMLParser: An Open Source BML Parser/Analyzer. In: Aylett R, Krenn B, Pelachaud C, Shimodaira H, eds. Intelligent Virtual Agents. Lecture Notes in Computer Science. Vol 8108. Springer; 2013: 432-433.

    AsapRealizer 2.0: The Next Steps in Fluent Behavior Realization for ECAs

    van Welbergen H, Yaghoubzadeh R, Kopp S. AsapRealizer 2.0: The Next Steps in Fluent Behavior Realization for ECAs. In: Bickmore T, Marsella S, Sidner C, eds. Intelligent Virtual Agents. Lecture Notes in Computer Science. Vol 8637. Cham: Springer International Publishing; 2014: 449-462.
    Natural human interaction is highly dynamic and responsive: interlocutors produce utterances incrementally, smoothly switch speaking turns with virtually no delay, make use of on-the-fly adaptation and (self-)interruptions, execute movement in tight synchrony, etc. We present the conglomeration of our research efforts in enabling the realization of such fluent interactions for Embodied Conversational Agents in the behavior realizer ‘AsapRealizer 2.0’ and show how it provides fluent realization capabilities that go beyond the state of the art.
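
    A minimal sketch of the incremental and interruptible behavior the abstract describes (assuming nothing about AsapRealizer's actual Java API): an utterance is built from small chunks, further chunks can be appended while earlier ones are already being realized, and a (self-)interruption simply discards the chunks that have not started yet.

        # Conceptual sketch only; not AsapRealizer's API.
        from collections import deque

        class IncrementalUtterance:
            def __init__(self):
                self._pending = deque()  # chunks not yet handed to synthesis
                self._spoken = []        # chunks already realized (cannot be retracted)

            def append(self, chunk: str) -> None:
                """Add a chunk while earlier chunks may already be running."""
                self._pending.append(chunk)

            def realize_next(self):
                """Hand the next chunk to the (hypothetical) synthesis back end."""
                if not self._pending:
                    return None
                chunk = self._pending.popleft()
                self._spoken.append(chunk)
                return chunk

            def interrupt(self) -> None:
                """Self-interruption: drop everything that has not started yet."""
                self._pending.clear()

        u = IncrementalUtterance()
        u.append("we could start with the warm-up,")
        u.append("or skip straight to the exercise,")
        print(u.realize_next())  # the first chunk goes out while planning continues
        u.interrupt()            # change of plan: the second chunk is never spoken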

    Architectures and Standards for IVAs at the Social Cognitive Systems Group


    Incremental, Adaptive and Interruptive Speech Realization for Fluent Conversation with ECAs

    van Welbergen H, Baumann T, Kopp S, Schlangen D. Incremental, Adaptive and Interruptive Speech Realization for Fluent Conversation with ECAs. In: Aylett R, Krenn B, Pelachaud C, Shimodaira H, eds. Intelligent Virtual Agents. Lecture Notes in Computer Science. Vol 8108. Springer; 2013: 468-469.
    • …